
    Convergence of multi-dimensional quantized SDEs

    We quantize a multidimensional SDE (in the Stratonovich sense) by solving the related system of ODEs in which the $d$-dimensional Brownian motion has been replaced by the components of functional stationary quantizers. We make a connection with rough path theory to show that the quantized solutions of the ODEs converge toward the solution of the SDE. On the way to this result we provide convergence rates of optimal quantizers toward the Brownian motion for the $\frac{1}{q}$-Hölder distance, $q > 2$, in $L^p(\mathbb{P})$. Comment: 43 pages
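    The substitution described above can be illustrated with a minimal numpy sketch: quantize the Karhunen-Loève coordinates of a Brownian path with a toy per-coordinate grid (standing in for an optimal functional quantizer), then run an Euler scheme on the resulting ODE. The drift `b`, diffusion `sigma`, truncation level and grid are illustrative choices, not the paper's actual scheme.

    ```python
    import numpy as np

    def kl_path(xi, t):
        """Truncated Karhunen-Loeve expansion of Brownian motion on [0, 1]."""
        k = np.arange(1, len(xi) + 1)
        freq = (k - 0.5) * np.pi
        return np.sqrt(2) * np.sum(xi[None, :] * np.sin(np.outer(t, freq)) / freq[None, :], axis=1)

    def quantize(xi, grid):
        """Nearest-neighbour projection of each KL coordinate onto a finite grid."""
        return grid[np.argmin(np.abs(xi[:, None] - grid[None, :]), axis=1)]

    rng = np.random.default_rng(0)
    d_kl = 8                              # truncation level (illustrative)
    xi = rng.standard_normal(d_kl)        # Gaussian KL coordinates of one path
    grid = np.linspace(-2.5, 2.5, 11)     # toy one-dimensional codebook per coordinate
    xi_q = quantize(xi, grid)

    # Euler scheme for the ODE x' = b(x) + sigma(x) * w_q'(t),
    # where w_q is the (smooth) quantized path replacing the Brownian motion.
    t = np.linspace(0.0, 1.0, 1001)
    w_q = kl_path(xi_q, t)
    b = lambda x: -x                      # toy drift
    sigma = lambda x: 0.5                 # toy constant diffusion coefficient
    x = np.empty_like(t)
    x[0] = 1.0
    for i in range(len(t) - 1):
        dt = t[i + 1] - t[i]
        dw = w_q[i + 1] - w_q[i]
        x[i + 1] = x[i] + b(x[i]) * dt + sigma(x[i]) * dw
    ```

    With a true product quantizer of the KL coordinates, the paper's result says that such ODE solutions converge (in the rough-path sense) to the Stratonovich SDE solution as the quantizer size grows.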

    Minimal L-space and Halmos-Savage criterion for majorized experiments


    Functional co-monotony of processes with applications to peacocks and barrier options

    We show that several general classes of stochastic processes satisfy a functional co-monotony principle, including processes with independent increments, Brownian diffusions and Liouville processes. As a first application, we recover some recent results about peacock processes obtained by Hirsch et al., which were themselves motivated by an earlier work of Carr et al. on the sensitivity of Asian call options with respect to their volatility and residual maturity (seniority). We also derive semi-universal bounds for various barrier options. Comment: 27 pages

    Quadratic optimal functional quantization of stochastic processes and numerical applications

    In this paper, we present an overview of recent developments in the functional quantization of stochastic processes, with an emphasis on the quadratic case. Functional quantization is a way to approximate a process, viewed as a Hilbert-valued random variable, using a nearest-neighbour projection onto a finite codebook. Special emphasis is placed on computational aspects and numerical applications, in particular the pricing of some path-dependent European options. Comment: 41 pages

    Theoretical Properties of Projection Based Multilayer Perceptrons with Functional Inputs

    Many real-world data are sampled functions. As shown by Functional Data Analysis (FDA) methods, spectra, time series, images, gesture recognition data, etc. can be processed more efficiently if their functional nature is taken into account during the data analysis process. This is done by extending standard data analysis methods so that they can apply to functional inputs. A general way to achieve this goal is to compute projections of the functional data onto a finite-dimensional sub-space of the functional space. The coordinates of the data on a basis of this sub-space provide standard vector representations of the functions, and the resulting vectors can be processed by any standard method. In our previous work, this general approach was used to define projection-based Multilayer Perceptrons (MLPs) with functional inputs. In this paper we study important theoretical properties of the proposed model. We show in particular that MLPs with functional inputs are universal approximators: they can approximate to arbitrary accuracy any continuous mapping from a compact sub-space of a functional space to R. Moreover, we provide a consistency result showing that any mapping from a functional space to R can be learned from examples by a projection-based MLP: the generalization mean square error of the MLP decreases to the smallest possible mean square error on the data as the number of examples goes to infinity.
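    The projection step described above can be sketched in numpy: compute the coordinates of sampled functions on a finite orthonormal basis via discretized $L^2$ inner products, then feed those coordinates to a one-hidden-layer MLP. The trigonometric basis, dimensions, and random weights (standing in for a trained network) are illustrative assumptions, not the authors' experimental setup.

    ```python
    import numpy as np

    def projection_coords(f_samples, basis):
        """Coordinates of sampled functions on a finite basis, via
        discretized L^2 inner products (assumes an orthonormal basis)."""
        dt = 1.0 / f_samples.shape[-1]
        return f_samples @ basis.T * dt

    def mlp_forward(coords, W1, b1, W2, b2):
        """One-hidden-layer MLP acting on the projection coordinates."""
        h = np.tanh(coords @ W1 + b1)
        return h @ W2 + b2

    rng = np.random.default_rng(2)
    n_grid, n_basis, n_hidden = 100, 5, 8
    t = np.linspace(0.0, 1.0, n_grid, endpoint=False)

    # Orthonormal trigonometric basis of a finite-dimensional sub-space.
    basis = np.array([np.ones_like(t)] +
                     [np.sqrt(2) * np.cos(2 * np.pi * k * t) for k in range(1, n_basis)])

    # A batch of sampled input functions (e.g. spectra or time series).
    funcs = rng.standard_normal((10, n_grid))
    coords = projection_coords(funcs, basis)      # shape (10, n_basis)

    # Random weights stand in for a trained network.
    W1 = rng.standard_normal((n_basis, n_hidden)) * 0.5
    b1 = np.zeros(n_hidden)
    W2 = rng.standard_normal((n_hidden, 1)) * 0.5
    b2 = np.zeros(1)
    out = mlp_forward(coords, W1, b1, W2, b2)     # one real output per function
    ```

    The universal approximation result above concerns exactly this composition: projection onto a finite basis followed by a standard MLP on the coordinate vector.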